Event Representations for Automated Story Generation with Deep Neural Nets
Martin, Lara J. (Georgia Institute of Technology) | Ammanabrolu, Prithviraj (Georgia Institute of Technology) | Wang, Xinyu (Georgia Institute of Technology) | Hancock, William (Georgia Institute of Technology) | Singh, Shruti (Georgia Institute of Technology) | Harrison, Brent (Georgia Institute of Technology) | Riedl, Mark O. (Georgia Institute of Technology)
Automated story generation is the problem of automatically selecting a sequence of events, actions, or words that can be told as a story. We seek to develop a system that can generate stories by learning everything it needs to know from textual story corpora. To date, recurrent neural networks that learn language models at character, word, or sentence levels have had little success generating coherent stories. We explore the question of event representations that provide a mid-level of abstraction between words and sentences in order to retain the semantic information of the original data while minimizing event sparsity. We present a technique for preprocessing textual story data into event sequences. We then present a technique for automated story generation whereby we decompose the problem into the generation of successive events (event2event) and the generation of natural language sentences from events (event2sentence). We give empirical results comparing different event representations and their effects on event successor generation and the translation of events to natural language.
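The event preprocessing the abstract describes, reducing each sentence to a structured event tuple, can be sketched as follows. This is a hypothetical, minimal illustration: the function names are invented, and a real pipeline would rely on a dependency parse and generalize words to semantic classes rather than splitting on whitespace.

```python
from typing import NamedTuple, Optional

class Event(NamedTuple):
    """A 4-tuple event: (subject, verb, direct object, modifier)."""
    subject: str
    verb: str
    obj: Optional[str]
    modifier: Optional[str]

def sentence_to_event(sentence: str) -> Event:
    """Toy extraction: first token as subject, second as verb, third as
    object, remainder as modifier. A real system would use a syntactic
    parse and abstract tokens into word classes to reduce sparsity."""
    tokens = sentence.lower().rstrip(".").split()
    subject = tokens[0] if tokens else ""
    verb = tokens[1] if len(tokens) > 1 else ""
    obj = tokens[2] if len(tokens) > 2 else None
    modifier = " ".join(tokens[3:]) or None
    return Event(subject, verb, obj, modifier)
```

Mapping a corpus through such a function yields the event sequences that event2event is trained on, while event2sentence learns the inverse mapping back to natural language.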
Toward Automated Story Generation with Markov Chain Monte Carlo Methods and Deep Neural Networks
Harrison, Brent (Georgia Institute of Technology) | Purdy, Christopher (Georgia Institute of Technology) | Riedl, Mark O. (Georgia Institute of Technology)
In this paper, we introduce an approach to automated story generation using Markov Chain Monte Carlo (MCMC) sampling. This approach uses a sampling algorithm based on Metropolis-Hastings to generate a probability distribution from which stories can be drawn via random sampling, such that they adhere to criteria learned by recurrent neural networks. We show the applicability of our technique through a case study in which we generate novel stories using an acceptance criterion learned from a set of movie plots taken from Wikipedia. This study shows that stories generated using this approach adhere to this criterion 85%-86% of the time.
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.89)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.69)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Undirected Networks > Markov Models (0.60)
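The Metropolis-Hastings loop the abstract refers to can be sketched as below. This is a hedged illustration with invented names: the `score` function stands in for the RNN-learned acceptance criterion, and the proposal here is a simple single-word substitution rather than the paper's actual proposal distribution.

```python
import random

def mh_story_sample(seed_story, vocab, score, steps=1000, rng=None):
    """Metropolis-Hastings over stories (lists of words): propose a
    single-word substitution and accept with probability min(1, p'/p),
    where `score` is a stand-in for the learned acceptance criterion."""
    rng = rng or random.Random(0)
    story = list(seed_story)
    current = score(story)
    for _ in range(steps):
        proposal = list(story)
        proposal[rng.randrange(len(proposal))] = rng.choice(vocab)
        new = score(proposal)
        # Symmetric proposal, so the MH ratio reduces to score ratio.
        if new >= current or rng.random() < new / max(current, 1e-12):
            story, current = proposal, new
    return story
```

Run long enough, the chain's samples are distributed in proportion to `score`, which is why stories drawn this way adhere to the learned criterion most of the time.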
Dear Leader’s Happy Story Time: A Party Game Based on Automated Story Generation
Horswill, Ian D. (Northwestern University)
Players in Dear Leader’s Happy Story Time are placed in the role of contestants in a reality TV show where they are forced to audition for roles in the upcoming film of the host, a deranged billionaire who has inexplicably been elected president. The stories are produced by a story generator that combines stock plots and characters to produce kitsch story outlines. The players then collaborate to improvise a camp performance of the outline. The game design provides a context for experimenting with automatic story generation within a narrative game, as well as an opportunity for experimenting with knowledge representation schemes for expressing the tropes of popular narrative. The story generator uses a higher-order logic for describing tropes, and an HTN planning algorithm based on Nau et al.’s SHOP.
- Leisure & Entertainment (1.00)
- Media > Television (0.53)
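The SHOP-style HTN planning the last abstract mentions can be sketched as a total-order, depth-first decomposition. This is a minimal illustration with an invented toy domain; the actual generator's trope representation uses a higher-order logic rather than the plain predicates shown here.

```python
def htn_plan(tasks, methods, operators, state):
    """SHOP-style total-order HTN planning: decompose tasks left to
    right. Primitive tasks apply an operator (returning a new state or
    None on failure); compound tasks try each method whose precondition
    holds, depth-first with backtracking. Returns a plan or None."""
    if not tasks:
        return []
    head, rest = tasks[0], tasks[1:]
    if head in operators:  # primitive task: apply its operator
        new_state = operators[head](state)
        if new_state is None:
            return None
        tail = htn_plan(rest, methods, operators, new_state)
        return None if tail is None else [head] + tail
    for precond, subtasks in methods.get(head, []):  # compound task
        if precond(state):
            plan = htn_plan(subtasks + rest, methods, operators, state)
            if plan is not None:
                return plan
    return None
```

A stock plot such as "rescue" would be encoded as a method decomposing into primitive steps (travel, confront, free the captive), and the planner stitches such tropes into a story outline.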